Iterative minimization of the Rayleigh quotient by block steepest descent iterations
Authors
Abstract
The topic of this paper is the convergence analysis of subspace gradient iterations for the simultaneous computation of a few of the smallest eigenvalues, together with the associated eigenvectors, of a symmetric and positive definite matrix pair (A, M). The methods are based on subspace iterations for A⁻¹M and use the Rayleigh-Ritz procedure for convergence acceleration. New sharp convergence estimates are proved by generalizing estimates that have been presented for vectorial steepest descent iterations (see SIAM J. Matrix Anal. Appl., 32(2):443-456, 2011).
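The kind of iteration described in the abstract can be sketched in a few lines. The following is an illustrative NumPy/SciPy sketch of a generic block steepest descent with Rayleigh-Ritz extraction for the pencil (A, M), not the authors' implementation; all function and variable names are assumptions.

```python
import numpy as np
from scipy.linalg import eigh  # generalized symmetric eigensolver

def block_steepest_descent(A, M, p, iters=200, tol=1e-8):
    """Approximate the p smallest eigenpairs of the SPD pencil (A, M)
    by a block steepest descent iteration with Rayleigh-Ritz extraction.
    Illustrative sketch only."""
    n = A.shape[0]
    X = np.linalg.qr(np.random.default_rng(0).standard_normal((n, p)))[0]
    for _ in range(iters):
        # Rayleigh-Ritz on the current block: X becomes M-orthonormal Ritz vectors
        theta, C = eigh(X.T @ A @ X, X.T @ M @ X)
        X = X @ C
        # Residuals A x_i - theta_i M x_i: gradients of the Rayleigh quotient
        R = A @ X - (M @ X) * theta
        if np.linalg.norm(R) < tol:
            break
        # Trial subspace span{X, R}; Rayleigh-Ritz picks the p smallest Ritz pairs
        S = np.linalg.qr(np.hstack([X, R]))[0]
        mu, Z = eigh(S.T @ A @ S, S.T @ M @ S)
        X = S @ Z[:, :p]
    theta, C = eigh(X.T @ A @ X, X.T @ M @ X)
    return theta, X @ C
```

Enriching the block with the residual directions before the Rayleigh-Ritz step is what distinguishes this accelerated scheme from plain block power iteration.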
Similar resources
Convergence Analysis of Restarted Krylov Subspace Eigensolvers
The A-gradient minimization of the Rayleigh quotient makes it possible to construct robust and fast-convergent eigensolvers for the generalized eigenvalue problem for (A, M) with symmetric and positive definite matrices. The A-gradient steepest descent iteration is the simplest case of more general restarted Krylov subspace iterations for the special case that all step-wise generated Krylov subspaces are tw...
Convergence Analysis of Gradient Iterations for the Symmetric Eigenvalue Problem
Gradient iterations for the Rayleigh quotient are simple and robust solvers for determining a few of the smallest eigenvalues together with the associated eigenvectors of (generalized) matrix eigenvalue problems for symmetric matrices. Sharp convergence estimates for the Ritz values and Ritz vectors are derived for various steepest descent/ascent gradient iterations. The analysis shows that poores...
Minimization of Tikhonov Functionals in Banach Spaces
Tikhonov functionals are known to be well suited for obtaining regularized solutions of linear operator equations. We analyze two iterative methods for finding the minimizer of norm-based Tikhonov functionals in Banach spaces. One is the steepest descent method, whereby the iterations are directly carried out in the underlying space, and the other one performs iterations in the dual space. We p...
Residual norm steepest descent based iterative algorithms for Sylvester tensor equations
Consider the following consistent Sylvester tensor equation
\[ \mathscr{X}\times_1 A + \mathscr{X}\times_2 B + \mathscr{X}\times_3 C = \mathscr{D}, \]
where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with examining a simple and neat framework for accelerating the speed of convergence of the gradient-based iterative algorithm and ...
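A plain residual-norm steepest descent loop for this tensor equation might look as follows. This is an illustrative NumPy sketch for small dense matrices, with exact line search along the gradient direction; the helper names `tprod` and `sylvester_sd` are hypothetical, and the acceleration framework examined in the paper is not included.

```python
import numpy as np

def tprod(A, B, C, X):
    """Apply the Sylvester tensor operator X x1 A + X x2 B + X x3 C
    via mode-n products (X is a third-order tensor)."""
    return (np.einsum('ia,ajk->ijk', A, X)    # mode-1 product with A
            + np.einsum('jb,abk->ajk', B, X)  # mode-2 product with B
            + np.einsum('kc,abc->abk', C, X)) # mode-3 product with C

def sylvester_sd(A, B, C, D, iters=500, tol=1e-14):
    """Steepest descent on 0.5 * ||D - L(X)||_F^2, where L = tprod."""
    X = np.zeros_like(D)
    for _ in range(iters):
        R = D - tprod(A, B, C, X)        # residual
        G = tprod(A.T, B.T, C.T, R)      # descent direction: adjoint applied to R
        gn = np.vdot(G, G)
        if gn < tol:
            break
        LG = tprod(A, B, C, G)
        X = X + (gn / np.vdot(LG, LG)) * G  # exact line search step
    return X
```

The adjoint of the operator is obtained simply by transposing each coefficient matrix in the mode-n products, which is what makes the gradient cheap to form.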
The Block Preconditioned Steepest Descent Iteration for Elliptic Operator Eigenvalue Problems
The block preconditioned steepest descent iteration is an iterative eigensolver for subspace eigenvalue and eigenvector computations. An important area of application of the method is the approximate solution of mesh eigenproblems for self-adjoint elliptic partial differential operators. The subspace iteration makes it possible to compute some of the smallest eigenvalues together with the associated i...
Journal: Numerical Linear Algebra with Applications
Volume 21, Issue: -
Pages: -
Published: 2014